
# Pre-trained Language Model

## Tybert
License: Apache-2.0 · Author: Trendyol
A Turkish BERT model pre-trained by Trendyol, suitable for various natural language understanding tasks.
Tags: Large Language Model, Transformers, Other

## Ernie Gram Zh
Author: nghuyong
ERNIE-Gram is a pre-trained natural language understanding model that uses explicit n-gram masked language modeling.
Tags: Large Language Model, Transformers, Chinese

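The core idea behind ERNIE-Gram's pretraining objective — masking contiguous n-gram spans rather than independent single tokens — can be sketched in plain Python. The function name, parameters, and sampling scheme below are illustrative assumptions, not ERNIE-Gram's actual implementation:

```python
import random

def ngram_mask(tokens, max_n=3, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Mask contiguous n-gram spans (n drawn from 1..max_n) until roughly
    mask_rate of the tokens are masked. Spans are kept non-overlapping."""
    rng = random.Random(seed)
    out = list(tokens)
    budget = max(1, round(len(tokens) * mask_rate))
    positions = set()
    attempts = 0
    while len(positions) < budget and attempts < 100:
        attempts += 1
        # Never draw a span longer than the remaining mask budget.
        n = rng.randint(1, min(max_n, budget - len(positions)))
        start = rng.randrange(0, len(tokens) - n + 1)
        span = range(start, start + n)
        if positions.isdisjoint(span):  # keep spans non-overlapping
            positions.update(span)
    for i in positions:
        out[i] = mask_token
    return out
```

Masking whole spans forces the model to predict multi-token units from surrounding context, rather than recovering each token from its immediate neighbors.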
## Chinese Bert Wwm
License: Apache-2.0 · Author: hfl
A Chinese pre-trained BERT model that uses a whole-word-masking strategy, designed to accelerate Chinese natural language processing research.
Tags: Large Language Model, Chinese

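Whole word masking means that when any subword piece of a word is selected for masking, every piece of that word is masked together. A minimal sketch in plain Python, assuming WordPiece-style tokens where a `##` prefix marks a continuation piece (the function name and parameters are illustrative, not the model's actual training code):

```python
import random

def whole_word_mask(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Group WordPiece tokens into words, then mask whole words at a time."""
    # A token starting with "##" continues the previous word.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    rng = random.Random(seed)
    n_to_mask = max(1, round(len(words) * mask_rate))
    chosen = rng.sample(words, n_to_mask)
    out = list(tokens)
    for word in chosen:
        for i in word:  # mask every subword piece of the chosen word
            out[i] = mask_token
    return out
```

Compared with masking individual pieces, this prevents the model from trivially reconstructing a masked piece from the unmasked pieces of the same word.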
## Tcr Bert Mlm Only
Author: wukevin
TCR-BERT is a pre-trained model based on the BERT architecture, optimized for T-cell receptor (TCR) sequences via a masked amino acid modeling task.
Tags: Protein Model, Transformers
© 2025 AIbase